A micrometer, sometimes known as a micrometer screw gauge (MSG), is a device incorporating a calibrated screw for accurate measurement of the size of components (Encyclopedia Americana (1988) "Micrometer", Encyclopedia Americana 19: 500). It is widely used in mechanical engineering, machining, metrology, and most mechanical trades, along with other dimensional instruments such as dial, vernier, and digital calipers. Micrometers are usually, but not always, in the form of calipers (opposing ends joined by a frame). The spindle is a very accurately machined screw, and the object to be measured is placed between the spindle and the anvil. The spindle is moved by turning the ratchet knob or thimble until the object to be measured is lightly touched by both the spindle and the anvil.
The London Science Museum contains an exhibit, "James Watt's end measuring instrument with micrometer screw, 1776", which the Science Museum claims is probably the first screw micrometer made. This instrument is intended to measure items very accurately by placing them between the two anvils and then advancing one using a fine micrometer screw until both are in contact with the object, the distance between them being precisely recorded on the two dials. However, as the Science Museum notes, there is a possibility that this instrument was not made c. 1776 by Watt, but in 1876, when it was placed in that year's Special Loan Exhibition of scientific instruments in South Kensington.
Henry Maudslay built a bench micrometer in the early 19th century that was nicknamed "the Lord Chancellor" among his staff because it was the final judge on measurement accuracy and precision in the firm's work. In 1844, details of Joseph Whitworth's workshop micrometer were published. This was described as having a strong frame of cast iron, at opposite ends of which were two highly finished steel cylinders that were traversed longitudinally by the action of screws. The ends of the cylinders were hemispherical where they met. One screw was fitted with a wheel graduated to measure to one ten-thousandth of an inch. His object was to furnish ordinary mechanics with an instrument that, while it afforded very accurate indications, was not very liable to be deranged by the rough handling of the workshop.
The first documented development of a handheld micrometer screw was by Jean Laurent Palmer of Paris in 1848 (Roe 1916: 212); the device is therefore often called a palmer in French, tornillo de Palmer ("Palmer screw") in Spanish, and calibro Palmer ("Palmer caliper") in Italian. (Those languages also use the micrometer cognates: micromètre, micrómetro, micrometro.) The micrometer caliper was introduced to the mass market in anglophone countries by Brown & Sharpe in 1867 (Roe 1916: 210–213, 215), allowing the penetration of the instrument's use into the average machine shop. Brown & Sharpe were inspired by several earlier devices, one of them being Palmer's design. In 1888, Edward W. Morley added to the precision of micrometric measurements and proved their accuracy in a complex series of experiments.
The culture of toolroom accuracy and precision, which started with interchangeability pioneers including Gribeauval, Tousard, Simeon North, Hall, Eli Whitney, and Samuel Colt, and continued through leaders such as Maudslay, Palmer, Joseph Whitworth, Brown, Sharpe, Pratt, Amos Whitney, Leland, Johansson, and others, grew during the Machine Age to become an important part of combining applied science with technology. Beginning in the early 20th century, one could no longer truly master tool and die making, machine tool building, or engineering without some knowledge of the science of metrology, as well as the sciences of chemistry and physics (for metallurgy, kinematics/dynamics, and quality).
For example, if the lead of a screw is 1 mm, but the major diameter (here, outer diameter) is 10 mm, then the circumference of the screw is 10π, or about 31.4 mm. Therefore, an axial movement of 1 mm is amplified (magnified) to a circumferential movement of 31.4 mm. This amplification allows a small difference in the sizes of two similar measured objects to correlate to a larger difference in the position of a micrometer's thimble. In some micrometers, even greater accuracy is obtained by using a differential screw adjuster to move the thimble in much smaller increments than a single thread would allow.
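To make this amplification concrete, here is a minimal Python sketch using the figures from the example above; the function name and signature are illustrative only, not part of any standard.

```python
import math

def thimble_travel(axial_mm: float, lead_mm: float, major_diameter_mm: float) -> float:
    """Circumferential thimble movement produced by a given axial spindle movement.

    One full turn (one lead) of the screw moves the thimble edge through one
    circumference, so the amplification factor is circumference / lead.
    """
    circumference = math.pi * major_diameter_mm
    return axial_mm * circumference / lead_mm

# Values from the example: 1 mm lead, 10 mm major diameter.
print(f"{thimble_travel(axial_mm=1.0, lead_mm=1.0, major_diameter_mm=10.0):.1f} mm")  # 31.4 mm
```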
In classic-style analog micrometers, the position of the thimble is read directly from scale markings on the thimble and sleeve (for the names of parts, see the next section). A vernier scale is often included, which allows the position to be read to a fraction of the smallest scale mark. In digital micrometers, an electronic readout displays the length digitally on an LCD on the instrument. There also exist mechanical-digit versions, in the style of car odometers, where the numbers "roll over".
A micrometer is composed of:
- Frame: the C-shaped body that holds the anvil and barrel in constant relation to each other.
- Anvil: the part that the spindle moves toward, and against which the sample rests.
- Sleeve or barrel: the stationary round component with the linear scale on it.
- Lock nut or thimble lock: the knurled component that can be tightened to hold the spindle stationary.
- Screw: the heart of the micrometer, hidden inside the barrel.
- Spindle: the cylindrical component that the thimble causes to move toward the anvil.
- Thimble: the component that is turned by thumb, carrying a graduated scale.
- Ratchet stop: the device on the end that limits applied pressure by slipping at a calibrated torque.
Suppose the thimble were screwed out so that graduation 2, and three additional sub-divisions, were visible on the sleeve (as shown in the image), and that graduation 1 on the thimble coincided with the axial line on the sleeve. The reading would then be 0.200 + 0.075 + 0.001, or 0.276 inch.
As shown in the image, suppose that the thimble were screwed out so that graduation 5, and one additional 0.5 subdivision, were visible on the sleeve. The reading from the axial line on the sleeve almost reaches graduation 28 on the thimble; the best estimate is 27.9 graduations. The reading then would be 5.00 (exact) + 0.5 (exact) + 0.279 (estimate) = 5.779 mm (estimate). As the last digit is an "estimated tenth", both 5.780 mm and 5.778 mm are also reasonably acceptable readings, but the former cannot be written as 5.78 mm: by the rules for significant figures, that form would express ten times less precision than the instrument actually has. Note, however, that the nature of the object being measured often requires rounding the result to fewer significant figures than the instrument is capable of.
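To make the arithmetic of both examples explicit, here is a minimal Python sketch; the function names and signatures are my own illustration, not part of any standard.

```python
def inch_reading(major: int, subdivisions: int, thimble: float) -> float:
    """Inch micrometer: 0.100" major graduations on the sleeve, 0.025"
    sub-divisions, and 0.001" graduations on the thimble."""
    return major * 0.100 + subdivisions * 0.025 + thimble * 0.001

def metric_reading(whole_mm: int, half_divisions: int, thimble: float) -> float:
    """Metric micrometer: 1 mm graduations on the sleeve, 0.5 mm
    sub-divisions, and 0.01 mm graduations on the thimble."""
    return whole_mm + half_divisions * 0.5 + thimble * 0.01

print(f"{inch_reading(2, 3, 1.0):.3f} in")     # 0.276 in (inch example above)
print(f"{metric_reading(5, 1, 27.9):.3f} mm")  # 5.779 mm (metric example above)
```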
The additional digit of these micrometers is obtained by finding the line on the sleeve vernier scale which exactly coincides with one on the thimble. The number of this coinciding vernier line represents the additional digit.
Thus, the reading for metric micrometers of this type is the number of whole millimeters (if any) and the number of hundredths of a millimeter, as with an ordinary micrometer, and the number of thousandths of a millimeter given by the coinciding vernier line on the sleeve vernier scale.
For example, a measurement of 5.783 millimetres would be obtained by reading 5.5 millimetres on the sleeve, and then adding 0.28 millimetre as determined by the thimble. The vernier would then be used to read the 0.003 (as shown in the image).
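Extending the earlier sketch, the vernier simply contributes one more additive term; again, this is an illustrative sketch rather than a definitive implementation.

```python
def metric_vernier_reading(sleeve_mm: float, thimble_hundredths: float,
                           vernier_line: int) -> float:
    """Metric vernier micrometer: sleeve reading in mm (including any visible
    0.5 mm sub-division), thimble graduations in hundredths of a mm, and the
    number of the coinciding vernier line in thousandths of a mm."""
    return sleeve_mm + thimble_hundredths * 0.01 + vernier_line * 0.001

print(f"{metric_vernier_reading(5.5, 28, 3):.3f} mm")  # 5.783 mm (example above)
```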
Inch micrometers are read in a similar fashion.
Note: 0.01 millimeter ≈ 0.000394 inch, and 0.002 millimeter ≈ 0.000079 inch (about 79 millionths); alternatively, 0.0001 inch = 0.00254 millimeters exactly. Therefore, metric micrometers provide smaller measuring increments than comparable inch micrometers: the smallest graduation of an ordinary inch-reading micrometer is 0.001 inch, while the vernier type has graduations down to 0.0001 inch (0.00254 mm). When using either a metric or an inch micrometer without a vernier, smaller readings than those graduated may of course be obtained by visual interpolation between graduations.
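These conversions follow directly from the exact definition 1 inch = 25.4 mm, as this small sketch shows:

```python
MM_PER_INCH = 25.4  # exact by definition

def mm_to_inch(mm: float) -> float:
    """Convert millimeters to inches."""
    return mm / MM_PER_INCH

print(f"{mm_to_inch(0.01):.6f} in")   # 0.000394 in
print(f"{mm_to_inch(0.002):.6f} in")  # 0.000079 in
```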
The accuracy of micrometers is checked by using them to measure gauge blocks (BS EN ISO 3650: "Geometrical product specifications (GPS). Length standards. Gauge blocks" (1999)), rods, or similar standards whose lengths are precisely and accurately known. If the gauge block is known to be 0.75000 ± 0.00005 inch ("seven-fifty plus or minus fifty millionths", that is, "seven hundred fifty thou plus or minus half a tenth"), then the micrometer should measure it as 0.7500 inch. If the micrometer measures 0.7503 inch, then it is out of calibration. Cleanliness and low but consistent torque are especially important when calibrating: each tenth (that is, each ten-thousandth of an inch) or hundredth of a millimetre "counts"; each is important. A mere speck of dirt, or a bit too much squeeze, obscures the truth of whether the instrument can read correctly. The solution is simply conscientiousness: cleaning, patience, due care and attention, and repeated measurements (good repeatability assures the calibrator that the technique is working correctly).
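A minimal sketch of such a check is below, using the 0.75000 inch gauge block from the text; the ±0.0001 inch tolerance is an assumed example value, not a figure from any standard.

```python
def calibration_error(measured_in: float, nominal_in: float) -> float:
    """Signed error of a micrometer reading against a gauge block's nominal size."""
    return measured_in - nominal_in

# Example from the text: a 0.75000" gauge block measured as 0.7503".
error = calibration_error(0.7503, 0.7500)
print(f"error = {error:+.4f} in")  # +0.0003 in, i.e. three tenths

# Tolerance of +/- 0.0001 in (one tenth) is an illustrative assumption.
print("out of calibration" if abs(error) > 0.0001 else "within tolerance")
```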
Calibration typically checks the error at 3 to 5 points along the range. Only one of these can be adjusted to zero. If the micrometer is in good condition, then they are all so near to zero that the instrument seems to read essentially "dead-on" all along its range; no noticeable error is seen at any point. In contrast, on a worn-out micrometer (or one that was poorly made to begin with), one can "chase the error up and down the range", that is, move it up or down to any of various points along the range by adjusting the sleeve, but one cannot eliminate it from all points at once.
Calibration can also include the condition of the tips (flat and parallel), the ratchet, and the linearity of the scale (ITTC Recommended Procedures: Sample Work Instructions, Calibration of Micrometers). Flatness and parallelism are typically measured with a gauge called an optical flat, a disc of glass or plastic ground with extreme accuracy to have flat, parallel faces; it allows light bands to be counted when the micrometer's anvil and spindle are against it, revealing their amount of geometric inaccuracy.
Commercial machine shops, especially those that do certain categories of work (military or commercial aerospace, nuclear power, medical, and others), are required by various standards organizations and standards (such as ISO, ANSI, ASME (e.g., ASME B89.1.13-2013, Micrometers), ASTM, SAE, AIA, the U.S. military, and others) to calibrate micrometers and other gauges on a schedule (often annually), to affix a label to each gauge giving it an ID number and a calibration expiration date, to keep a record of all the gauges by ID number, and to specify in inspection reports which gauge was used for a particular measurement.
Not all calibration is an affair for metrology labs. A micrometer can be calibrated on-site anytime, at least in the most basic and important way (if not comprehensively), by measuring a high-grade gauge block and adjusting to match. Even gauges that are calibrated annually and are within their expiration timeframe should be checked this way every month or two if they are used daily; they will usually check out OK, needing no adjustment.
The accuracy of the gauge blocks themselves is traceable through a chain of comparisons back to a master standard such as the international prototype of the meter. This bar of metal, like the international prototype of the kilogram, is maintained under controlled conditions at the International Bureau of Weights and Measures headquarters in France, which is one of the principal measurement standards laboratories of the world. These master standards have extreme-accuracy regional copies (kept in the national laboratories of various countries, such as NIST), and metrological equipment makes the chain of comparisons. Because the meter is now defined by reference to the speed of light, the international prototype of the meter is not quite as indispensable as it once was. But such master gauges are still important for calibrating and certifying metrological equipment. Equipment described as "NIST traceable" means that its comparison against master gauges, and their comparison against others, can be traced back through a chain of documentation to equipment in the NIST labs. Maintaining this degree of traceability requires some expense, which is why NIST-traceable equipment is more expensive than non-NIST-traceable equipment. But applications needing the highest degree of quality control mandate the cost.